131 research outputs found

    Belief merging within fragments of propositional logic

    Recently, belief change within the framework of fragments of propositional logic has gained increasing attention. Previous works focused on belief contraction and belief revision in the Horn fragment. However, the problem of belief merging within fragments of propositional logic has been neglected so far. This paper presents a general approach to defining new merging operators, derived from existing ones, such that the result of merging remains in the fragment under consideration. Our approach is not limited to the Horn fragment but applies to any fragment of propositional logic characterized by a closure property on the sets of models of its formulae. We study the logical properties of the proposed operators in terms of satisfaction of merging postulates, considering in particular distance-based merging operators for the Horn and Krom fragments. Comment: To appear in the Proceedings of the 15th International Workshop on Non-Monotonic Reasoning (NMR 2014).
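The closure property mentioned above can be made concrete: a set of models is Horn-expressible exactly when it is closed under pairwise intersection of their true atoms. A minimal sketch (function name and atoms are hypothetical; the paper's actual refinement operators are more involved) shows how a merging result could be "repaired" into the Horn fragment by closing its model set:

```python
from itertools import product

def close_under_intersection(models):
    """Close a set of models (frozensets of true atoms) under pairwise
    intersection -- the closure property characterizing the Horn fragment."""
    closed = set(models)
    changed = True
    while changed:
        changed = False
        for a, b in product(list(closed), repeat=2):
            inter = a & b
            if inter not in closed:
                closed.add(inter)
                changed = True
    return closed

# Example: {p,q} and {p,r} are both models; Horn-closing the set adds
# their intersection {p}, so the merged result stays Horn-expressible.
models = {frozenset({"p", "q"}), frozenset({"p", "r"})}
print(close_under_intersection(models))
```

Whether one closes the model set (adding models) or restricts it instead is precisely the kind of design choice the paper studies against the merging postulates.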

    Fusion de bases propositionnelles : une méthode basée sur les R-ensembles

    Collective decision making leads to interaction between agents in order to elaborate a consistent common decision. From a data-processing point of view, this problem can be brought back to the merging of different sources of information. In knowledge representation for artificial intelligence, several approaches have been proposed for propositional base merging; however, most of them are defined at the semantic level and are hardly usable in practice. This paper proposes a new syntactic approach to belief base fusion, called Removed Sets Fusion (RSF). The notion of removed set, initially defined in the context of belief revision, is extended to fusion, and most of the classical fusion operations are syntactically captured by RSF. In order to implement RSF efficiently, the paper shows how RSF can be encoded into a logic program with answer set semantics, then presents an adaptation of the smodels system devoted to efficiently computing the removed sets in order to perform RSF. Finally, a preliminary experimental study shows that the answer set programming approach seems promising for performing belief base fusion in real-scale applications.
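The core idea above can be illustrated by brute force (the clauses and helper names are hypothetical; RSF itself computes removed sets via an ASP encoding, not truth tables): a removed set is a cardinality-minimal subset of formulas whose removal restores consistency of the union of the bases.

```python
from itertools import combinations, product

def satisfiable(clauses, atoms):
    """Truth-table satisfiability check for a set of clauses
    (each clause is a frozenset of literals like 'p' or '-p')."""
    for vals in product([True, False], repeat=len(atoms)):
        assign = dict(zip(atoms, vals))
        if all(any(not assign[l[1:]] if l.startswith("-") else assign[l]
                   for l in c) for c in clauses):
            return True
    return False

def removed_sets(clauses, atoms):
    """Cardinality-minimal subsets of clauses whose removal restores
    consistency -- a naive reading of the removed-sets idea."""
    for k in range(len(clauses) + 1):
        hits = [set(rem) for rem in combinations(clauses, k)
                if satisfiable([c for c in clauses if c not in rem], atoms)]
        if hits:
            return hits
    return []

# {p} and {-p} conflict; q is innocent, so the removed sets are
# {{p}} and {{-p}} -- each of size 1, never touching q.
union = [frozenset({"p"}), frozenset({"-p"}), frozenset({"q"})]
print(removed_sets(union, ["p", "q"]))
```

The ASP encoding in the paper replaces this exponential enumeration with answer-set computation, which is what makes the approach practical.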

    A merging data tool for knowledge based photogrammetry: the case study of the Castle of Shawbak, Jordan

    The present paper addresses an approach for merging heritage survey and archaeological knowledge. The theoretical framework is the integration of photogrammetric survey and the documentation process, as practically used in different archaeological excavations. Merging surveyed geometries and knowledge is a complex task: many variables have to be considered during the merging process. Photogrammetric survey results and knowledge can both be seen as information, sorted by source. A source is a set of information provided by the operators involved in the excavation process. Such operators can be archaeologists, photogrammetrists, or any other researchers (e.g. a topographer) involved in the study. The merging process involves verifying the consistency of the different sources and aggregating all the information from the sources into a global result. Each source, respectively each operator, owns a personal representation of its knowledge domain: a photogrammetrist uses geometrical primitives and 3D representations of the surveyed object, while an archaeologist has a textual and semantic representation of the objects. Merging all these sets of information together requires a tool which can be easily operated by most of the participants in the research and which can furthermore manage the 'multiple knowledge' of the surveyed object. This tool, called Ametist (an acronym standing for Arpenteur ManagEment Tool for Interactive Survey Treatment), uses a simple interface for displaying results and knowledge in various forms (textual, 2D map, 3D scene, XML). It can merge the 'multiple knowledge' automatically, and its merge engine can solve conflicts (object identification mismatches, an object measured several times, spatial collisions, etc.). When conflicts cannot be solved automatically, the application reports the inconsistency errors and asks a user to correct the information involved manually. As inconsistency can be present in any information, all operators have to be able to use the interface, so the tool provides a simple, easy-to-use interface. This document first addresses the concept of knowledge-based photogrammetry (with ARPENTEUR) and then presents 'Ametist'. Finally, a real case study is considered to highlight the first results of such a system, in the frame of a French-Italian scientific partnership with the "Dipartimento di Studi storici e Geografici" of the University of Florence, in charge of the archaeological research. The selected case study is the Castle of Shawbak, in Jordan, known in medieval written sources as the "Crac de Montréal".

    Lexicographic Inference for Partially Ordered Belief Bases

    Coherence-based approaches are quite popular for reasoning under inconsistency. Most of them are defined with respect to totally preordered belief bases, such as the lexicographic inference, which is known to have desirable properties from theoretical, practical and psychological points of view. However, partially preordered belief bases offer much more flexibility to represent incomplete knowledge efficiently and to avoid comparing unrelated pieces of information. In this paper, we propose a lexicographic inference for partially preordered belief bases that extends the classical one. On the one hand, we define a natural inference relation which consists in applying classical lexicographic inference from all compatible totally preordered belief bases. On the other hand, we propose a novel cardinality-based preorder between consistent subbases. This cardinality-based preorder can be applied indifferently to partially or totally preordered belief bases. Then, applying classical inference on the preferred consistent subbases according to this preorder provides another lexicographic inference relation for partially preordered belief bases. Interestingly enough, we show that the second inference is covered by the first one. Lastly, a semantic characterization of these two definitions is provided.
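For the totally preordered case that this work generalizes, the classical lexicographic comparison of consistent subbases can be sketched in a few lines (strata contents and names are hypothetical): a subbase is ranked by how many formulas it keeps in each stratum, most-reliable stratum first.

```python
def lex_key(subbase, strata):
    """Profile of a subbase: number of formulas kept in each stratum,
    most-reliable stratum first; tuples compare lexicographically."""
    return tuple(len(subbase & s) for s in strata)

# Strata from most to least reliable; formulas are just labels here.
strata = [{"a"}, {"b", "c"}, {"d"}]
s1 = {"a", "b"}        # profile (1, 1, 0)
s2 = {"a", "c", "d"}   # profile (1, 1, 1)
print(max([s1, s2], key=lambda s: lex_key(s, strata)))  # prefers s2
```

The paper's contribution is to lift this comparison to partially preordered bases, either via all compatible total preorders or via the cardinality-based preorder it introduces.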

    Surveying medieval archaeology: a new form for Harris paradigm linking photogrammetry and temporal relations

    The paper presents some reflections on an interdisciplinary project between medieval archaeologists from the University of Florence (Italy) and ICT researchers from CNRS LSIS in Marseille (France), aiming at a connection between 3D spatial representation and archaeological knowledge. It is well known that laser scanning, photogrammetry and computer vision are very attractive tools for archaeologists, although the integration of the representation of space with the representation of archaeological time has not yet found a methodological standard of reference. We try to develop an integrated system for archaeological 3D survey and all other types of archaeological data and knowledge by integrating observable (material) and non-graphic (interpretive) data. Survey plays a central role, since it is both a metric representation of the archaeological site and, to a wider extent, an interpretation of it (being also a common basis for communication between the two teams). More specifically, 3D survey is crucial, allowing archaeologists to connect actual spatial assets to the stratigraphic formation processes (i.e. to the archaeological time) and to translate spatial observations into a historical interpretation of the site. We propose a common formalism, stemming from ontologies, for describing photogrammetric survey and archaeological knowledge: indeed, ontologies are fully used to model and store 3D data and archaeological knowledge. We equip this formalism with a qualitative representation of time. Stratigraphic analyses (both of excavated deposits and of upstanding structures) are closely related to E. C. Harris' theory of the "Stratigraphic Unit" ("US" from now on). Every US is connected to the others by geometric, topological and, possibly, temporal links, and is recorded by the 3D photogrammetric survey. However, the limitations of the Harris Matrix approach lead us to use another representation formalism for stratigraphic relationships, namely Qualitative Constraint Networks (QCNs), successfully used in the domain of knowledge representation and reasoning in artificial intelligence for representing temporal relations.
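The qualitative reasoning behind a QCN can be hinted at with a toy point-style algebra over stratigraphic units (the relation symbols and table below are an illustrative simplification, not the richer calculus the project actually uses): composing the known relations between US1/US2 and US2/US3 constrains the possible relations between US1 and US3.

```python
# Composition table for a point-style algebra over stratigraphic units:
# '<' (deposited before), '>' (deposited after), '=' (contemporary).
COMP = {
    ("<", "<"): {"<"}, ("<", "="): {"<"}, ("=", "<"): {"<"},
    (">", ">"): {">"}, (">", "="): {">"}, ("=", ">"): {">"},
    ("=", "="): {"="},
    ("<", ">"): {"<", ">", "="}, (">", "<"): {"<", ">", "="},
}

def infer(r1, r2):
    """Relations possible between US1 and US3, given US1 r1 US2 and US2 r2 US3."""
    return COMP[(r1, r2)]

print(infer("<", "<"))  # a unit below another, itself below a third: {'<'}
```

Propagating such compositions over a whole network, and detecting empty relation sets as inconsistencies, is what makes QCNs more expressive than a bare Harris Matrix.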

    A General Modifier-based Framework for Inconsistency-Tolerant Query Answering

    We propose a general framework for inconsistency-tolerant query answering in the existential rules setting. This framework unifies the main semantics proposed in the state of the art and introduces new ones based on cardinality and majority principles. It relies on two key notions: modifiers and inference strategies. An inconsistency-tolerant semantics is seen as a composite modifier plus an inference strategy. We compare the obtained semantics from a productivity point of view.
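The modifier/strategy decomposition can be sketched on a propositional toy (all names and the mutual-exclusion constraints are hypothetical, and existential rules are deliberately left out): one modifier computes repairs as maximal consistent subsets, and different inference strategies then decide which facts to accept across those repairs.

```python
from itertools import combinations

def consistent(facts, mutex):
    """A fact set is consistent if it contains no mutually exclusive pair."""
    return not any(set(pair) <= facts for pair in mutex)

def repairs(facts, mutex):
    """Maximal (for set inclusion) consistent subsets -- one possible
    'modifier' in the framework."""
    cands = [set(s) for k in range(len(facts), -1, -1)
             for s in combinations(facts, k) if consistent(set(s), mutex)]
    return [c for c in cands if not any(c < d for d in cands)]

def universal(atom, reps):
    """'Universal' strategy: accept only what holds in every repair."""
    return all(atom in r for r in reps)

def majority(atom, reps):
    """'Majority' strategy: accept what holds in more than half of the repairs."""
    return sum(atom in r for r in reps) * 2 > len(reps)

facts = {"a", "b", "c"}
mutex = [("a", "b")]           # a and b cannot hold together
reps = repairs(facts, mutex)   # the repairs are {a, c} and {b, c}
print(universal("c", reps), universal("a", reps), majority("c", reps))
```

Swapping the modifier (e.g. cardinality-maximal repairs instead of inclusion-maximal ones) or the strategy yields the different semantics the framework compares.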

    A Semantic Characterization for ASP Base Revision

    The paper deals with base revision for Answer Set Programming (ASP). Base revision in classical logic is done by the removal of formulas. Exploiting the non-monotonicity of ASP allows one to propose other revision strategies, namely an addition strategy or a removal and/or addition strategy. These strategies allow one to define families of rule-based revision operators. The paper presents a semantic characterization of these families of revision operators in terms of answer sets. This semantic characterization makes it possible to consider equivalently the evolution of syntactic logic programs and the evolution of their semantic content. The paper then studies the logical properties of the proposed operators and gives complexity results.

    Computing Query Answering With Non-Monotonic Rules: A Case Study of Archaeology Qualitative Spatial Reasoning

    This paper deals with querying ontology-based knowledge bases equipped with non-monotonic rules, through a case study within the framework of Cultural Heritage. It focuses on the 3D underwater surveys of the Xlendi wreck, which is represented by an OWL2 knowledge base with a large dataset. The paper aims at improving the interaction between the archaeologists and the knowledge base by providing new queries that involve non-monotonic rules in order to perform qualitative spatial reasoning. To this end, the knowledge base, initially represented in OWL2-QL, is translated into an equivalent Answer Set Programming (ASP) program and enriched with a set of non-monotonic ASP rules suitable to express defaults and exceptions. An ASP query answering approach is proposed and implemented. Furthermore, thanks to the increased expressiveness of non-monotonic rules, it provides query answering about spatial reasoning and spatial relations between artifacts, which is not possible with query answering languages such as SPARQL and SQWRL.
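The default-and-exception pattern that motivates the ASP translation can be mimicked with a negation-as-failure reading (the predicates and artifact names below are purely hypothetical illustrations, not the Xlendi knowledge base's actual vocabulary): a conclusion is drawn unless an exception is provably known.

```python
def holds_default(conclusion, exceptions, known):
    """Negation-as-failure reading of a default rule: draw the conclusion
    unless one of its exceptions is provably known."""
    return conclusion if not (exceptions & known) else None

# Hypothetical example: an amphora is assumed to lie on the wreck's main
# deposit unless it is explicitly recorded as having been moved.
known_facts = {"amphora42_moved"}
print(holds_default("amphora7_on_deposit", {"amphora7_moved"}, known_facts))
print(holds_default("amphora42_on_deposit", {"amphora42_moved"}, known_facts))
```

Monotonic query languages such as SPARQL cannot express this "unless known otherwise" step, which is the gap the ASP rules fill.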